Efficiency of Coordinate Descent Methods for Structured Nonconvex Optimization
Abstract
Novel coordinate descent (CD) methods are proposed for minimizing nonconvex functions consisting of three terms: (i) a continuously differentiable term, (ii) a simple convex term, and (iii) a concave and continuous term. First, by extending randomized CD to nonsmooth nonconvex settings, we develop a coordinate subgradient method that randomly updates block-coordinate variables using a block composite subgradient mapping. This method converges asymptotically to critical points with a proven sublinear convergence rate for certain optimality measures. Second, we develop a randomly permuted CD method with two alternating steps: linearizing the concave part and cycling through the variables. We prove asymptotic convergence to critical points and a sublinear complexity rate for objectives with both smooth and concave parts. Third, we extend accelerated coordinate descent (ACD) to nonsmooth and nonconvex optimization to develop a novel proximal DC algorithm whereby we solve the subproblem inexactly by ACD. Convergence is guaranteed with only a few ACD iterations for each subproblem, and complexity is established for the identification of some approximate critical points. Fourth, we further develop the third method to minimize certain ill-conditioned nonconvex functions: weakly convex functions with high Lipschitz constant to negative curvature ratios. We show that, under specific criteria, the ACD-based method has superior complexity compared to conventional gradient methods. Finally, an empirical study on sparsity-inducing learning models demonstrates that CD methods are superior to gradient-based methods for certain large-scale problems.
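To make the first approach concrete, the following is a minimal sketch (not the authors' implementation) of a randomized proximal coordinate step for a capped-ℓ1 regularized least-squares model, which has the smooth + simple convex + concave structure described above: the concave part of the penalty is linearized through a supergradient, and a single coordinate is then updated with a proximal gradient step. The penalty choice and all names are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * |.| (applied to a scalar here)."""
    return np.sign(z) * max(abs(z) - tau, 0.0)

def randomized_cd_capped_l1(A, b, lam=0.1, theta=1.0, n_iters=5000, seed=0):
    """
    Minimize 0.5*||Ax - b||^2 + lam*sum(min(|x_j|, theta)) by a randomized
    coordinate step: linearize the concave part of the capped-l1 penalty,
    then take a proximal coordinate-gradient step on one coordinate.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - b                      # residual, kept up to date
    col_sq = (A ** 2).sum(axis=0)      # per-coordinate Lipschitz constants
    for _ in range(n_iters):
        j = rng.integers(n)
        gamma = 1.0 / max(col_sq[j], 1e-12)
        grad_j = A[:, j] @ r           # partial gradient of the smooth term
        # supergradient of the concave part h(x) = -lam * max(|x_j| - theta, 0)
        v_j = -lam * np.sign(x[j]) if abs(x[j]) > theta else 0.0
        x_new = soft_threshold(x[j] - gamma * (grad_j + v_j), gamma * lam)
        r += A[:, j] * (x_new - x[j])  # cheap residual update
        x[j] = x_new
    return x
```

Each iteration touches a single column of A, so the per-step cost is O(m) rather than O(mn).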
Similar resources
Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
In this paper we propose new methods for solving huge-scale optimization problems. For problems of this size, even the simplest full-dimensional vector operations are very expensive. Hence, we propose to apply an optimization technique based on random partial update of decision variables. For these methods, we prove the global estimates for the rate of convergence. Surprisingly enough, for cert...
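A minimal sketch of the kind of random partial update described above, for a plain least-squares objective; the residual recursion keeps the per-iteration cost at one column of A instead of a full matrix-vector product. Function names and defaults are illustrative, not taken from the paper.

```python
import numpy as np

def random_coordinate_gradient(A, b, n_iters=10000, seed=0):
    """Random coordinate gradient descent for min 0.5*||Ax - b||^2."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - b                 # residual Ax - b
    L = (A ** 2).sum(axis=0)      # coordinate-wise Lipschitz constants
    for _ in range(n_iters):
        j = rng.integers(n)
        g_j = A[:, j] @ r         # partial derivative
        delta = -g_j / max(L[j], 1e-12)
        x[j] += delta
        r += A[:, j] * delta      # update residual in O(m)
    return x
```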
Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
In this paper we analyze several new methods for solving nonconvex optimization problems with the objective function formed as a sum of two terms: one is nonconvex and smooth, and another is convex but simple and its structure is known. Further, we consider both cases: unconstrained and linearly constrained nonconvex problems. For optimization problems of the above structure, we propose random ...
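For the linearly constrained case mentioned above, a common device is to update a random pair of coordinates along the direction e_i − e_j, which preserves a constraint of the form sum(x) = const. The sketch below assumes a smooth (possibly nonconvex) objective supplied through a hypothetical partial_grad(x, i) callable and per-coordinate Lipschitz estimates L; it illustrates the idea rather than the paper's exact algorithm.

```python
import numpy as np

def pair_coordinate_descent(partial_grad, L, x0, n_iters=5000, seed=0):
    """
    Random 2-coordinate descent for min f(x) subject to sum(x) = sum(x0).
    `partial_grad(x, i)` returns the i-th partial derivative of the smooth
    objective; `L` holds per-coordinate Lipschitz estimates. Each step moves
    along e_i - e_j, so the linear constraint stays satisfied.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    for _ in range(n_iters):
        i, j = rng.choice(n, size=2, replace=False)
        gi = partial_grad(x, i)
        gj = partial_grad(x, j)
        # gradient step along the feasible direction e_i - e_j
        t = -(gi - gj) / max(L[i] + L[j], 1e-12)
        x[i] += t
        x[j] -= t
    return x
```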
Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
In this paper we prove a new complexity bound for a variant of Accelerated Coordinate Descent Method [7]. We show that this method often outperforms the standard Fast Gradient Methods (FGM, [3, 6]) on optimization problems with dense data. In many important situations, the computational expenses of oracle and method itself at each iteration of our scheme are perfectly balanced (both depend line...
SparseNet: Coordinate Descent With Nonconvex Penalties
We address the problem of sparse selection in linear models. A number of nonconvex penalties have been proposed in the literature for this purpose, along with a variety of convex-relaxation algorithms for finding good solutions. In this article we pursue a coordinate-descent approach for optimization, and study its convergence properties. We characterize the properties of penalties suitable for...
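A typical instance of such nonconvex-penalized coordinate descent uses the MCP penalty, whose one-dimensional subproblem has a closed-form thresholding solution. The sketch below is a simplified illustration in that spirit (not the SparseNet code); it assumes gamma > 1/v_j so each coordinate subproblem is convex, and all names are illustrative.

```python
import numpy as np

def mcp_threshold(z, lam, gamma, v=1.0):
    """Closed-form minimizer of 0.5*v*t^2 - z*t + MCP(t; lam, gamma).
    Assumes gamma > 1/v so the one-dimensional subproblem is convex."""
    if abs(z) > v * gamma * lam:
        return z / v
    s = np.sign(z) * max(abs(z) - lam, 0.0)   # soft threshold
    return s / (v - 1.0 / gamma)

def cd_mcp(A, b, lam=0.1, gamma=3.0, n_sweeps=100):
    """Cyclic coordinate descent for least squares with an MCP penalty."""
    m, n = A.shape
    x = np.zeros(n)
    r = b - A @ x                      # residual b - Ax
    v = (A ** 2).sum(axis=0)           # squared column norms
    for _ in range(n_sweeps):
        for j in range(n):
            z = A[:, j] @ r + v[j] * x[j]   # partial correlation
            x_new = mcp_threshold(z, lam, gamma, v[j])
            r += A[:, j] * (x[j] - x_new)   # keep residual consistent
            x[j] = x_new
    return x
```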
Parallel Coordinate Descent Methods for Big Data Optimization
In this work we show that randomized (block) coordinate descent methods can be accelerated by parallelization when applied to the problem of minimizing the sum of a partially separable smooth convex function and a simple separable convex function. The theoretical speedup, as compared to the serial method, and referring to the number of iterations needed to approximately solve the problem with h...
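A simplified sketch of the parallel idea: a random batch of tau coordinates is updated simultaneously with a damped step size, where the factor beta stands in for the separability-dependent constant from the analysis; beta = tau is used here as a conservative default. The lasso objective, names, and defaults are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def parallel_cd_lasso(A, b, lam=0.1, tau=8, beta=None, n_iters=2000, seed=0):
    """
    Simplified parallel (mini-batch) coordinate descent for the lasso
    0.5*||Ax - b||^2 + lam*||x||_1: each iteration picks tau coordinates
    uniformly at random and updates them simultaneously with a damped step.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - b
    L = (A ** 2).sum(axis=0)
    beta = float(tau) if beta is None else beta
    for _ in range(n_iters):
        S = rng.choice(n, size=min(tau, n), replace=False)
        grads = A[:, S].T @ r                 # partial gradients (parallelizable)
        steps = 1.0 / (beta * np.maximum(L[S], 1e-12))
        z = x[S] - steps * grads
        x_new = np.sign(z) * np.maximum(np.abs(z) - steps * lam, 0.0)
        r += A[:, S] @ (x_new - x[S])         # one synchronized residual update
        x[S] = x_new
    return x
```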
Journal
Journal title: Lecture Notes in Computer Science
Year: 2021
ISSN: 1611-3349, 0302-9743
DOI: https://doi.org/10.1007/978-3-030-67664-3_5